A note on robust descent in differentiable optimization

Author

  • Jean-Pierre Dussault
Abstract

In this note, we recall two solutions for alleviating the catastrophic cancellations that occur when comparing function values in descent algorithms. The automatic finite differencing approach [4] was shown to be useful for trust region and line search variants. The main original contribution is to successfully adapt the line search strategy of [6] for use within trust-region-like algorithms.
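As a minimal illustration of the cancellation in question (a toy Python sketch under assumed data, not the algorithm of the paper): near a minimizer the true decrease f(x+d) - f(x) is of order d^2, often far below the rounding error of f itself, so the naive acceptance test f(x+d) < f(x) becomes unreliable. Evaluating the difference directly, as the automatic finite differencing of [4] does mechanically, or testing slopes in the spirit of the line search criteria of [6], sidesteps the subtraction.

    # Toy sketch (hypothetical, not the paper's code): why f(x+d) < f(x)
    # fails near a minimizer, and two cancellation-free alternatives.

    def f(x):
        return 1.0 + (x - 1.0) ** 2      # minimum f(1) = 1

    def df(x):
        return 2.0 * (x - 1.0)           # derivative of f

    x = 1.0 + 1e-9                       # iterate very close to the minimizer
    d = -1e-9                            # a genuine descent step

    # Naive test: the subtraction cancels every significant digit,
    # so the true decrease of about 1e-18 is reported as exactly 0.0.
    naive = f(x + d) - f(x)

    # Difference form: evaluate f(x+d) - f(x) from a hand-expanded
    # expression, 2(x-1)d + d^2; the automatic finite differencing of [4]
    # generates such difference code mechanically.
    direct = d * (2.0 * (x - 1.0) + d)

    # Slope form: accept the step from derivative information instead,
    # in the spirit of the approximate line search criteria of [6].
    slope_ok = df(x + d) * d <= 0.0

    print(naive, direct, slope_ok)       # 0.0  ~-1e-18  True

The point of the sketch is only that both remedies recover the sign of a decrease that the naive comparison destroys in floating-point arithmetic.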

Similar articles

A Note on the Descent Property Theorem for the Hybrid Conjugate Gradient Algorithm CCOMB Proposed by Andrei

In [1] (Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization, J. Optim. Theory Appl. 141 (2009) 249-264), an efficient hybrid conjugate gradient algorithm, the CCOMB algorithm, is proposed for solving unconstrained optimization problems. However, the proof of Theorem 2.1 in [1] is incorrect due to an erroneous inequality which was used to establish the descent property for the s...

Aerodynamic Design Optimization Using Genetic Algorithm (RESEARCH NOTE)

An efficient formulation for the robust shape optimization of aerodynamic objects is introduced in this paper. The formulation has three essential features. First, an Euler solver based on a second-order Godunov scheme is used for the flow calculations. Second, a genetic algorithm with binary number encoding is implemented for the optimization procedure. The third ingredient of the procedure is...

Regularity Conditions for Non-Differentiable Infinite Programming Problems using Michel-Penot Subdifferential

In this paper we study optimization problems with infinitely many inequality constraints on a Banach space, where the objective function and the binding constraints are locally Lipschitz. Necessary optimality conditions and regularity conditions are given. Our approach is based on the Michel-Penot subdifferential.

Supervised Descent Method

In this dissertation, we focus on solving Nonlinear Least Squares problems using a supervised approach. In particular, we developed a Supervised Descent Method (SDM), performed a thorough theoretical analysis, and demonstrated its effectiveness in optimizing analytic functions and in four real-world applications: Inverse Kinematics, Rigid Tracking, Face Alignment (frontal and multi-view), and...

Non-differentiable Optimization of Fuzzy Logic Systems

In the present-day use of fuzzy logic systems, the tuning of their parameters has become an important issue. Many techniques, mainly based on the application of gradient descent, have been applied to this task, generally in order to minimize a quadratic error function. The class of fuzzy logic systems using piecewise linear membership functions (e.g., triangular or trapezoidal) and/or minimum or maximum ...

Journal:
  • Oper. Res. Lett.

Volume 45, Issue: -

Pages: -

Published: 2017